Blog post about agentic pipeline using valkey #290


Open · vitarb wants to merge 6 commits into main

Conversation


@vitarb vitarb commented Jul 1, 2025

Description

This PR adds a full end-to-end agentic demo built on Valkey, showcasing a real-time, stream-driven pipeline for ingesting, enriching, and distributing news content to personalized user feeds. It includes:

A Twitter-style agentic architecture implemented with Valkey Streams, Lua scripting, and Prometheus metrics.

A GPU-accelerated enrichment pipeline using Hugging Face Transformers.

Real-time observability via Grafana dashboards.

A fully working Docker Compose setup with support for both CPU and GPU execution.

A blog post documenting the system design, performance characteristics, and real-world debugging lessons.

Issues Resolved

Closes valkey-io/valkey-doc#320

Check List

Commits are signed per the DCO using `--signoff`.

By submitting this pull request, I confirm that my contribution is made under the terms of the BSD-3-Clause License.

@stockholmux (Member) left a comment:

I can't review this as-is: it lacks the requisite front-matter metadata, so I can tell it hasn't been previewed. Please make sure your post works in Zola before submitting a PR (see the README).

@@ -0,0 +1,177 @@
# Lightning-Fast Agent Messaging with Valkey
Member:

Missing front matter.

Member:

Also - you need a bio.

Author:

I know, left that for the end :)

@stockholmux (Member) commented Jul 23, 2025:

Also: please fix the DCO issue.

**4. Readers fell behind during spikes**
A fixed 50 pops/sec couldn't keep up with 10k users. A self-tuning delay (`latest_uid * POP_RATE`) scaled up to 200 pops/sec.

All fixes are now defaults in the repo.
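The self-tuning delay described in point 4 might look roughly like the following sketch (Python, using a redis-py-compatible client; the `latest_uid` counter, the `feed:<uid>` streams, and the constants are illustrative rather than the demo's actual code):

```python
import random
import time

import redis  # redis-py speaks RESP, so the same client works with Valkey

POP_RATE = 0.02                # hypothetical: extra pops/sec per known user
MIN_RATE, MAX_RATE = 50, 200   # floor at the old fixed rate, cap at 200

r = redis.Redis(decode_responses=True)

while True:
    # Illustrative counter the load generator bumps for each simulated user.
    latest_uid = int(r.get("latest_uid") or 0)
    rate = min(MAX_RATE, max(MIN_RATE, latest_uid * POP_RATE))

    # Read (and drop) one entry from a random user's feed stream.
    uid = random.randint(0, max(latest_uid - 1, 0))
    entries = r.xrange(f"feed:{uid}", count=1)
    if entries:
        r.xdel(f"feed:{uid}", entries[0][0])

    time.sleep(1.0 / rate)     # the delay shrinks as the user count grows
```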

Reviewer:

"repository", not "repo". Let's avoid shorthand. Also, this is the first mention of the repo; I would introduce it earlier in the blog.

Author:

First mention now reads "valkey-agentic-demo repository" with a hyperlink in the intro.


Modern applications are moving beyond monoliths into distributed fleets of specialized agents—small programs that sense, decide, and act in real-time. When hundreds of these interact, their messaging layer must be lightning-fast, observable, and flexible enough to evolve without rewrites.

That requirement led us to **Valkey**: an open-source, community-driven, in-memory database fully compatible with Redis. With streams, Lua scripting, a mature JSON & Search stack, and a lightweight extension system, Valkey provides our agents with a fast, shared nervous system.

Reviewer:

Who is "us"? The wording here sounds like a third party considering different datastore options and discovering Valkey, but later in the blog you say "our roadmap" as a Valkey member, which I think is what you mean to refer to.

Author:

Adopted consistent perspective: we = engineering team using Valkey, not Valkey core maintainers.


Tails the per-user stream and emits new entries directly to the browser.
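As an illustration of that stage, a WebSocket pusher could be as small as the sketch below (asyncio, redis-py's async client, and the `websockets` package; the `feed:<uid>` key shape and message fields are assumptions, not the demo's exact schema):

```python
import asyncio
import json

import redis.asyncio as redis   # async redis-py client, Valkey-compatible
import websockets

r = redis.Redis(decode_responses=True)

async def push_feed(ws, uid: str):
    """Tail feed:<uid> and forward every new entry to the browser."""
    last_id = "$"                       # only entries added after we connect
    while True:
        # Block up to 5 s waiting for new entries on this user's stream.
        resp = await r.xread({f"feed:{uid}": last_id}, block=5000, count=10)
        for _stream, entries in resp:
            for entry_id, fields in entries:
                await ws.send(json.dumps(fields))
                last_id = entry_id

async def handler(ws):
    uid = await ws.recv()               # client announces its user id first
    await push_feed(ws, uid)

async def main():
    async with websockets.serve(handler, "0.0.0.0", 8765):
        await asyncio.Future()          # serve forever

asyncio.run(main())
```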

### Self-Tuning Readers (load generator & consumer)

Reviewer:

Stage 5?

Author:

Added Stage 5 – Reader (self‑tuning load consumer) and included code.

@@ -0,0 +1,177 @@
# Lightning-Fast Agent Messaging with Valkey

## From Tweet to Tailored Feed

Reviewer:

I love this intro so far, because it's highlighting Valkey's strengths and the way it addresses modern application requirements. I would add a bit more context about what you will cover in the blog and what the reader can expect to get out of it. Using your content and the help of AI, I got the following example of what I mean, which I liked:

As modern software moves toward ecosystems of intelligent agents—small, purpose-built programs that sense, decide, and act—the infrastructure underneath needs to keep up. These systems aren’t batch-driven or monolithic. They’re always on, highly parallel, and constantly evolving. What they need is a messaging backbone that’s fast, flexible, and observable by design.

Valkey fits that role perfectly. As a Redis-compatible, in-memory data store, it brings together real-time speeds with a growing suite of tools well-suited for agentic systems. Valkey Streams offer a clean, append-only log for sequencing events. Lua scripting lets you run coordination logic server-side, where it’s faster and easier to manage. JSON and Search support structured data flows without external ETL.

In this post, we’ll put all of that into practice. We’ve built a real-time content pipeline where each news headline is ingested, enriched with topics, routed to interested users, and pushed live to their feeds—all through a series of autonomous agents communicating via Valkey. The result is a low-latency, high-throughput system that’s simple to understand and easy to scale.

You’ll see how the pieces fit together: from the first headline pulled off the wire to the last WebSocket update in the browser. Along the way, we’ll share the design trade-offs, the tuning decisions, and the fixes that helped us hit production-grade speed and reliability. All of it runs in Docker, with GPU offload if you have one, and Grafana dashboards that light up as soon as messages start flowing.

Author:

Added three‑paragraph preview summarising demo scope and learning outcomes.


---

## Why We Bet on Valkey

Reviewer:

Similar issue as with intro. Sounds like Valkey betting on itself.

Author:

Section retitled "Why We Picked Valkey for the Job" and rewritten in third-party voice.


## Why We Bet on Valkey

Valkey Streams and consumer groups move messages in <1 ms. Lua keeps fan-out logic server-side. JSON/Search allows enrichment to stay in-memory. Grafana charts latency and backlog immediately. Python agents can be swapped for Rust or Go with no changes to the datastore.
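To make the Lua point concrete: fan-out can run as one server-side call instead of one round trip per follower. A minimal sketch follows; the `topic:*:subs` set and `feed:<uid>` stream names are illustrative, not the demo's real schema:

```python
import redis

r = redis.Redis(decode_responses=True)

# For every user subscribed to a topic, append the article to that user's
# feed stream: a single round trip, executed atomically inside Valkey.
FANOUT = r.register_script("""
local subs = redis.call('SMEMBERS', KEYS[1])          -- e.g. topic:tech:subs
for _, uid in ipairs(subs) do
  redis.call('XADD', 'feed:' .. uid, 'MAXLEN', '~', 1000, '*',
             'title', ARGV[1], 'topic', ARGV[2])
end
return #subs
""")

delivered = FANOUT(keys=["topic:tech:subs"], args=["Valkey 8.1 released", "tech"])
print(f"fanned out to {delivered} feeds")
```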

Reviewer:

I think this section is an opportunity to expand on these value propositions of Valkey. Why is Lua scripts running server side useful? What makes JSON in Valkey great? etc.

Author:

Table now explains each feature’s concrete value (ordering, zero round‑trip Lua, in‑memory JSON/Search, INFO metrics).


## Observability That Comes Standard

Every agent exports metrics. Grafana's dashboard auto-populates:
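For a flavor of what that instrumentation looks like, an agent might expose its counters and histograms with `prometheus_client` along these lines (metric and function names are made up for the example):

```python
from prometheus_client import Counter, Histogram, start_http_server

# Hypothetical metric names; the demo's dashboards may use different ones.
ENRICHED = Counter("articles_enriched_total", "Articles enriched and re-published")
LATENCY = Histogram("enrich_seconds", "Time spent enriching one article")

def classify(title: str) -> str:
    return "tech"              # stand-in for the Hugging Face classifier call

def enrich(article: dict) -> dict:
    with LATENCY.time():       # records per-article enrichment latency
        article["topic"] = classify(article["title"])
    ENRICHED.inc()
    return article

start_http_server(9100)        # Prometheus scrapes this port
```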

Reviewer:

Expand. Why did you build in observability? What is the value of each metric? Is there something specific about Valkey's design that enables this?

Author:

Observability table now ties every metric to an operational question.

| p99 Valkey op latency | ≈ 200 µs |
| GPU uplift (A10G) | 5x faster enrichment |

Scaling up? One Docker command. No Helm. No YAML deep dives.
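Assuming the agents are ordinary Compose services, that one command would be something like `docker compose up -d --scale enricher=4` (the `enricher` service name here is hypothetical).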

Reviewer:

Please expand. No shorthand.

* **Rust agents** using the same Streams API but with lower memory.
* **Auto-provisioned ACLs & metrics** via the MCP server.

Pull requests and fresh ideas welcome.

Reviewer:

link

Author:

Feed UI & Grafana URLs are now explicit Markdown links; the repository is also hyperlinked.

Reviewer:

Super cool overall, thanks for writing this up! Left some comments and suggestions.

Author:

Thanks, addressed the comments. Please let me know if you have any other recommendations.
